Induction of Selective Bayesian Classifiers
Authors
Abstract
In this paper, we examine previous work on the naive Bayesian classifier and review its limitations, which include a sensitivity to correlated features. We respond to this problem by embedding the naive Bayesian induction scheme within an algorithm that carries out a greedy search through the space of features. We hypothesize that this approach will improve asymptotic accuracy in domains that involve correlated features without reducing the rate of learning in ones that do not. We report experimental results on six natural domains, including comparisons with decision-tree induction, that support these hypotheses. In closing, we discuss other approaches to extending naive Bayesian classifiers and outline some directions for future research.
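To make the search procedure concrete, below is a minimal Python sketch of greedy forward feature selection wrapped around a naive Bayes learner. The scikit-learn GaussianNB estimator, the cross-validated accuracy criterion, and the stopping rule are illustrative assumptions; they stand in for, but do not reproduce, the exact evaluation procedure used in the paper.

```python
# Hedged sketch: greedy forward selection wrapped around naive Bayes, in the
# spirit of the selective Bayesian classifier described above. The evaluation
# criterion (cross-validated accuracy) and the estimator are assumptions.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

def selective_naive_bayes(X, y, cv=5):
    """Greedily add the feature that most improves accuracy; stop when none helps."""
    n_features = X.shape[1]
    selected = []          # indices of features chosen so far
    best_score = 0.0
    improved = True
    while improved and len(selected) < n_features:
        improved = False
        best_candidate = None
        for j in range(n_features):
            if j in selected:
                continue
            trial = selected + [j]
            score = cross_val_score(GaussianNB(), X[:, trial], y, cv=cv).mean()
            if score > best_score:
                best_score, best_candidate = score, j
        if best_candidate is not None:
            selected.append(best_candidate)
            improved = True
    return selected, best_score

if __name__ == "__main__":
    # Hypothetical usage on a toy dataset, purely for demonstration.
    from sklearn.datasets import load_breast_cancer
    data = load_breast_cancer()
    feats, acc = selective_naive_bayes(data.data, data.target)
    print(f"selected {len(feats)} of {data.data.shape[1]} features, CV accuracy {acc:.3f}")
```

Each outer iteration fits one candidate model per remaining feature, so the wrapper needs on the order of k² model evaluations for k features, which is why a greedy rather than exhaustive search through the feature space is attractive.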
Similar Resources
Floating search algorithm for structure learning of Bayesian network classifiers
This paper presents a floating search approach for learning the network structure of Bayesian network classifiers. A Bayesian network classifier is used which, in combination with the search algorithm, allows simultaneous feature selection and determination of the structure of the classifier. The introduced search algorithm enables conditional exclusions of previously added attributes and/or arcs...
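As a rough illustration of the floating idea (a generic sketch of sequential floating forward selection, not the authors' specific structure-learning algorithm), each forward step that adds the best candidate is followed by conditional backward steps that drop a previously added element whenever doing so improves the score. The score callable and the feature representation below are assumptions for illustration.

```python
# Hedged sketch of sequential floating forward selection (SFFS-style):
# after each forward step, conditionally exclude previously added features
# while that improves the evaluation score. The score function is abstract;
# any classifier-accuracy estimate could be plugged in (must handle the empty set).
def floating_forward_selection(features, score):
    selected = set()
    best = score(selected)
    while True:
        # Forward step: add the single best remaining feature, if it helps.
        gains = {f: score(selected | {f}) for f in features if f not in selected}
        if not gains:
            break
        f_add, s_add = max(gains.items(), key=lambda kv: kv[1])
        if s_add <= best:
            break
        selected.add(f_add)
        best = s_add
        # Floating step: conditionally drop earlier features while it helps.
        improved = True
        while improved and len(selected) > 1:
            improved = False
            drops = {f: score(selected - {f}) for f in selected if f != f_add}
            if drops:
                f_del, s_del = max(drops.items(), key=lambda kv: kv[1])
                if s_del > best:
                    selected.discard(f_del)
                    best = s_del
                    improved = True
    return selected, best

if __name__ == "__main__":
    # Toy score that rewards a hypothetical target subset, for demonstration only.
    target = {0, 2, 5}
    def toy_score(s):
        return len(s & target) - 0.1 * len(s - target)
    print(floating_forward_selection(range(8), toy_score))
```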
Bayesian Network Classifiers Versus k-NN Classifier Using Sequential Feature Selection
The aim of this paper is to compare Bayesian network classifiers to the k-NN classifier based on a subset of features. This subset is established by means of sequential feature selection methods. Experimental results show that Bayesian network classifiers more often achieve a better classification rate on different data sets than selective k-NN classifiers. The k-NN classifier performs well in ...
Comparing Case-Based Bayesian Network and Recursive Bayesian Multi-Net Classifiers
Recent work in Bayesian classifiers has shown that a better and more flexible representation of domain knowledge results in more accurate classifiers. We have recently examined a new type of Bayesian classifiers called Case-Based Bayesian Network (CBBN) classifiers. The basic idea is to partition the training data into semantically sound clusters. A local BN classifier is then learned independe...
Supervised classification with conditional Gaussian networks: Increasing the structure complexity from naive Bayes
Most of the Bayesian network-based classifiers are usually only able to handle discrete variables. However, most real-world domains involve continuous variables. A common practice to deal with continuous variables is to discretize them, with a subsequent loss of information. This work shows how discrete classifier induction algorithms can be adapted to the conditional Gaussian network paradigm ...
Learning Selectively Conditioned Forest Structures with Applications to DNBs and Classification
Dealing with uncertainty in Bayesian network structures using maximum a posteriori (MAP) estimation or Bayesian model averaging (BMA) is often intractable due to the super-exponential number of possible directed acyclic graphs. When the prior is decomposable, two classes of graphs where efficient learning can take place are tree structures and fixed orderings with limited in-degree. We show how...
Journal:
Volume, Issue:
Pages: -
Publication year: 1994